Robust polynomial time tensor decomposition

Author

  • Sam Hopkins
Abstract

Tensor decomposition has recently become an invaluable algorithmic primitive. It has seen much use in new algorithms with provable guarantees for fundamental statistics and machine learning problems. In these settings, some low-rank k-tensor A = ∑_{i=1}^r a_i^{⊗k}, which we would like to decompose into components a_1, …, a_r ∈ ℝ^n, is often not directly accessible. This could happen for many reasons; a common one is that A = 𝔼 X^{⊗k} for some random variable X, and estimating A to high precision may require too many independent samples from X. In this lecture we will dig into algorithms for robust tensor decomposition: that is, how to accomplish tensor decomposition efficiently in the presence of errors. We will focus on orthogonal tensor decomposition, where the components a_1, …, a_r ∈ ℝ^n of the tensor A = ∑_{i=1}^r a_i^{⊗k} to be decomposed are orthogonal unit vectors. Tensor decomposition is already both algorithmically nontrivial and quite useful in this setting; the orthogonal setting is good enough to give the best known algorithms for Gaussian mixtures, some kinds of dictionary learning, and the stochastic block model. As we saw before, via whitening, if the covariance matrix ∑_{i=1}^n a_i a_i^⊤ is known for non-orthogonal but linearly independent vectors a_1, …, a_n, then decomposing the tensor ∑_{i=1}^n a_i^{⊗3} reduces to orthogonal decomposition.
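To make the orthogonal primitive concrete, here is a minimal numpy sketch of tensor power iteration with deflation for a symmetric 3-tensor T = ∑_{i=1}^r λ_i a_i^{⊗3} with orthonormal a_i. This illustrates the noiseless version of the primitive, not the robust algorithm developed in the lecture; the function names and parameter choices below are mine.

```python
import numpy as np

def tensor_apply(T, x):
    """Contract a symmetric 3-tensor in two modes: returns T(I, x, x)."""
    return np.einsum('ijk,j,k->i', T, x, x)

def orthogonal_decompose(T, r, n_iter=100, n_restarts=10, seed=None):
    """Recover orthonormal components of T = sum_i lambda_i a_i^{⊗3}
    by tensor power iteration with deflation (noiseless sketch)."""
    rng = np.random.default_rng(seed)
    n = T.shape[0]
    T = T.copy()
    weights, components = [], []
    for _ in range(r):
        best_x, best_lam = None, -np.inf
        for _ in range(n_restarts):
            x = rng.standard_normal(n)
            x /= np.linalg.norm(x)
            for _ in range(n_iter):
                x = tensor_apply(T, x)
                x /= np.linalg.norm(x)
            lam = np.einsum('ijk,i,j,k->', T, x, x, x)
            if lam > best_lam:
                best_x, best_lam = x, lam
        weights.append(best_lam)
        components.append(best_x)
        # deflate: subtract the recovered rank-1 term before the next round
        T -= best_lam * np.einsum('i,j,k->ijk', best_x, best_x, best_x)
    return np.array(weights), np.array(components)

# usage: decompose T = sum_i q_i^{⊗3} for orthonormal columns q_i of Q
n, r = 8, 4
Q = np.linalg.qr(np.random.default_rng(0).standard_normal((n, n)))[0][:, :r]
T = sum(np.einsum('i,j,k->ijk', q, q, q) for q in Q.T)
weights, comps = orthogonal_decompose(T, r, seed=1)
```

The whitening reduction mentioned above would precede this step: with M = ∑ a_i a_i^⊤ known, contracting the tensor with M^{-1/2} in each mode orthonormalizes the components.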


Similar resources

Uniqueness of Tensor Decompositions with Applications to Polynomial Identifiability

We give a robust version of the celebrated result of Kruskal on the uniqueness of tensor decompositions: we prove that given a tensor whose decomposition satisfies a robust form of Kruskal’s rank condition, it is possible to approximately recover the decomposition if the tensor is known up to a sufficiently small (inverse polynomial) error. Kruskal’s theorem has found many applications in provi...
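For reference, Kruskal's condition for T = ∑_{i=1}^r a_i ⊗ b_i ⊗ c_i with factor matrices A, B, C is k_A + k_B + k_C ≥ 2r + 2, where k_M is the Kruskal rank of M: the largest k such that every k columns of M are linearly independent. A brute-force Kruskal-rank checker, purely illustrative (the exhaustive search is exponential in the number of columns, and the function name is mine):

```python
import numpy as np
from itertools import combinations

def kruskal_rank(M, tol=1e-10):
    """Largest k such that every set of k columns of M is linearly
    independent. Brute force, for illustration only."""
    n_cols = M.shape[1]
    for k in range(n_cols, 0, -1):
        if all(np.linalg.matrix_rank(M[:, list(cols)], tol=tol) == k
               for cols in combinations(range(n_cols), k)):
            return k
    return 0

# uniqueness holds when
# kruskal_rank(A) + kruskal_rank(B) + kruskal_rank(C) >= 2*r + 2
```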


Low-Rank Approximation and Completion of Positive Tensors

Unlike the matrix case, computing low-rank approximations of tensors is NP-hard and numerically ill-posed in general. Even computing the best rank-1 approximation of a tensor is NP-hard. In this paper, we use convex optimization to develop polynomial-time algorithms for low-rank approximation and completion of positive tensors. Our approach is to use algebraic topology to define a new (numerically well-p...
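The convex program from this abstract is not reproduced here; as a point of contrast with the hardness statement, the following is a sketch of the standard alternating higher-order power method, a heuristic that only finds a locally optimal rank-1 approximation (all names are mine):

```python
import numpy as np

def rank1_approx(T, n_iter=200, seed=None):
    """Alternating (higher-order power method) heuristic for a rank-1
    approximation lam * u ⊗ v ⊗ w of a 3-tensor T. Finds a local
    optimum only; the global problem is NP-hard."""
    rng = np.random.default_rng(seed)
    u, v, w = (rng.standard_normal(s) for s in T.shape)
    for _ in range(n_iter):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    lam = np.einsum('ijk,i,j,k->', T, u, v, w)
    return lam, u, v, w
```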


Fast and robust tensor decomposition with applications to dictionary learning

We develop fast spectral algorithms for tensor decomposition that match the robustness guarantees of the best known polynomial-time algorithms for this problem based on the sum-of-squares (SOS) semidefinite programming hierarchy. Our algorithms can decompose a 4-tensor with n-dimensional orthonormal components in the presence of error with constant spectral norm (when viewed as an n²-by-n² matr...
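In the noiseless orthonormal case, the unfolding view suggests a very short baseline: contract two modes of T = ∑_i a_i^{⊗4} with a random vector and read the components off an n-by-n eigendecomposition. This is a sanity-check sketch under exactness assumptions, not the robust spectral algorithm the abstract describes:

```python
import numpy as np

def decompose_orthogonal_4tensor(T, r, seed=None):
    """Recover orthonormal a_i from exact T = sum_i a_i^{⊗4}.
    M = sum_i <g, a_i>^2 a_i a_i^T has the a_i as eigenvectors,
    with almost surely distinct eigenvalues for random g."""
    rng = np.random.default_rng(seed)
    g = rng.standard_normal(T.shape[0])
    M = np.einsum('ijkl,k,l->ij', T, g, g)
    vals, vecs = np.linalg.eigh(M)
    top = np.argsort(-np.abs(vals))[:r]  # the r nonzero eigenvalues
    return vecs[:, top].T                # components, up to sign
```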


Planted Sparse Vector, Tensor Decomposition and Polynomial Optimization

On the other hand, better bounds can be achieved under stronger assumptions for both problems. Specifically, if we assume d = O(√n/√log n), then cn-sparse planted vectors can be recovered by a polynomial-time algorithm for some small constant c. For tensor decomposition with a_i ∼ i.i.d. N(0, I_d), a polynomial-time algorithm exists in the regime n ≤ d^{3/2}/polylog(n). An interesting fact is tha...


Decomposing Overcomplete 3rd Order Tensors using Sum-of-Squares Algorithms

Tensor rank and low-rank tensor decompositions have many applications in learning and complexity theory. Most known algorithms use unfoldings of tensors and can only handle rank up to n^⌊p/2⌋ for a p-th order tensor in ℝ^{n^p}. Previously no efficient algorithm could decompose 3rd order tensors when the rank is super-linear in the dimension. Using ideas from the sum-of-squares hierarchy, we give the firs...
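For context, the classical route to 3rd-order decomposition is Jennrich's simultaneous-diagonalization algorithm, which requires linearly independent components and therefore handles rank at most n; the SOS-based algorithms this abstract refers to are what push past that barrier. A noiseless sketch of the classical baseline (names mine):

```python
import numpy as np

def jennrich(T, r, seed=None):
    """Recover the a_i (up to scale and sign) from exact
    T = sum_i a_i ⊗ b_i ⊗ c_i with linearly independent factors."""
    rng = np.random.default_rng(seed)
    g = rng.standard_normal(T.shape[2])
    h = rng.standard_normal(T.shape[2])
    Mg = np.einsum('ijk,k->ij', T, g)  # = A diag(<c_i, g>) B^T
    Mh = np.einsum('ijk,k->ij', T, h)
    # eigenvectors of Mg Mh^+ with nonzero eigenvalues are the columns of A
    vals, vecs = np.linalg.eig(Mg @ np.linalg.pinv(Mh))
    top = np.argsort(-np.abs(vals))[:r]
    return np.real(vecs[:, top]).T
```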



Publication date: 2017